A Combinatorial-Probabilistic Diagnostic Entropy and Information

Author

  • Henryk Borowczyk
Abstract

A new combinatorial-probabilistic diagnostic entropy is introduced. It describes the pair-wise sum of probabilities of the system conditions that have to be distinguished during the diagnosing process. The proposed measure quantifies the uncertainty of the system conditions and, at the same time, the complexity of the diagnosis problem. Treating the combinatorial-probabilistic diagnostic entropy as a primary notion, the information delivered by the symptoms is then defined. Relationships are derived that allow an explicit, quantitative assessment of the information of a single symptom as well as of a set of symptoms. It is proved that the combinatorial-probabilistic information has the additivity property. The presented measures are focused on the diagnosis problem, but they can easily be applied to other disciplines such as decision theory and classification.
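The abstract does not state the functional form of the measure. One plausible reading of "pair-wise sum of probabilities of system conditions that have to be distinguished" is a sum of probability products p_i·p_j over the condition pairs to be told apart; the sketch below uses that assumption and is not Borowczyk's definition (the function name and default pairing are ours):

```python
from itertools import combinations

def pairwise_diagnostic_entropy(p, pairs=None):
    """Sum of probability products over the condition pairs that must be
    distinguished; by default every pair of conditions is distinguished."""
    if pairs is None:
        pairs = combinations(range(len(p)), 2)
    return sum(p[i] * p[j] for i, j in pairs)

# Four equally likely conditions: 6 pairs, each contributing 0.25 * 0.25
print(pairwise_diagnostic_entropy([0.25, 0.25, 0.25, 0.25]))  # -> 0.375
# A certain condition leaves nothing to distinguish
print(pairwise_diagnostic_entropy([1.0, 0.0, 0.0]))  # -> 0.0
```

Under this reading the measure is zero exactly when one condition is certain, and grows with the number and likelihood of condition pairs the diagnosis must separate, matching the "uncertainty and complexity" interpretation in the abstract.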


Similar articles

Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy

This study critically analyses the information-theoretic, axiomatic and combinatorial philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is shown to be the most fundamental (most primitive) of these three bases, since it gives (i) a derivation for the Kullback-Leibler cross-entropy and Shannon entropy functions, as simplified forms of the multinomial distribu...


Origins of the Combinatorial Basis of Entropy

The combinatorial basis of entropy, given by Boltzmann, can be written H = N^(-1) ln W, where H is the dimensionless entropy, N is the number of entities and W is the number of ways in which a given realization of a system can occur (its statistical weight). This can be broadened to give generalized combinatorial (or probabilistic) definitions of entropy and cross-entropy: H = κ(φ(W) + C) and D = −κ(φ(P...
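The Boltzmann form H = N^(-1) ln W can be checked numerically: with the multinomial statistical weight W = N!/∏ n_i!, Stirling's approximation drives H toward the Shannon entropy of the relative frequencies as N grows. A minimal sketch (the helper names are ours, not from the cited paper):

```python
import math

def boltzmann_entropy(counts):
    """H = (1/N) ln W with W = N! / prod(n_i!), the multinomial weight."""
    N = sum(counts)
    ln_w = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)
    return ln_w / N

def shannon_entropy(counts):
    """Shannon entropy (nats) of the relative frequencies n_i / N."""
    N = sum(counts)
    return -sum((n / N) * math.log(n / N) for n in counts if n)

# Same proportions (1:3) at growing N: the gap shrinks as N increases.
for scale in (10, 100, 1000):
    c = [scale, 3 * scale]
    print(sum(c), boltzmann_entropy(c), shannon_entropy(c))
```

Since W counts distinguishable arrangements, ln W never exceeds N times the Shannon entropy, so the Boltzmann value approaches the Shannon value from below.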



Combinatorial entropies and statistics

We examine the combinatorial or probabilistic definition ("Boltzmann's principle") of the entropy or cross-entropy function, H ∝ ln W or D ∝ −ln P, where W is the statistical weight and P the probability of a given realization of a system. Extremisation of H or D, subject to any constraints, thus selects the "most probable" (MaxProb) realization. If the system is multinomial, D converges asymptot...
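The asymptotic convergence mentioned above is a standard result: for a multinomial system with prior probabilities q, the quantity D = −(1/N) ln P approaches the Kullback-Leibler divergence of the observed frequencies from q as N grows. A short numerical check (helper names are illustrative):

```python
import math

def neg_log_multinomial(counts, q):
    """D = -(1/N) ln P for the multinomial P = (N!/prod n_i!) * prod q_i^n_i."""
    N = sum(counts)
    ln_w = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)
    ln_p = ln_w + sum(n * math.log(qi) for n, qi in zip(counts, q))
    return -ln_p / N

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi)

# Frequencies 1:3 under a uniform prior: D approaches D(p || q) as N grows.
q = [0.5, 0.5]
for scale in (10, 100, 1000):
    c = [scale, 3 * scale]
    print(sum(c), neg_log_multinomial(c, q), kl_divergence([0.25, 0.75], q))
```

Minimising D over realizations is then the same as selecting the "most probable" (MaxProb) realization, which is the point the abstract makes.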


Control System Diagnosis Algorithm Optimization - the Combinatorial Entropy Approach

This paper presents combinatorial measures of system-condition uncertainty (diagnostic entropy) and of the information carried by diagnostic symptoms. A multi-valued diagnostic model is assumed. The proposed measures can be used for diagnostic-model analysis and diagnosis-algorithm optimization. The optimization method exploits indispensable symptoms at every stage of the optimization process, pro...



Journal:
  • CoRR

Volume abs/0810.5535  Issue 

Pages  -

Publication date 2008